We present a simple transformation of any linear program or semidefinite program into an equivalent convex optimization problem whose only constraints are linear equations. The objective function is defined on the whole space, making virtually all subgradient methods immediately applicable. We observe, moreover, that the objective function is naturally smoothed, thereby allowing most first-order methods to be applied.

We develop complexity bounds in the unsmoothed case for a particular subgradient method, and in the smoothed case for Nesterov's original optimal first-order method for smooth functions. We achieve the desired bounds on the number of iterations, $O(1/\epsilon^2)$ and $O(1/\epsilon)$, respectively.

Perhaps most surprising is that the transformation from a linear program or a semidefinite program is simple, as is the basic theory, and yet the approach has been overlooked until now, a blind spot. Once the transformation is realized, the remaining effort in establishing complexity bounds is mostly straightforward, relying on various works of Nesterov.
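To make the setting concrete, here is a minimal sketch (not the paper's transformation itself) of the kind of method the abstract says becomes applicable: a subgradient iteration for minimizing a convex objective subject only to linear equations $Ax = b$, where each step is projected onto the null space of $A$ so the iterates stay feasible. All names below (`grad_f`, `A`, `b`, the toy instance) are illustrative assumptions, not from the paper.

```python
import numpy as np

def projected_subgradient(grad_f, A, b, x0, steps=2000):
    """Subgradient steps projected onto the affine set {x : A x = b}.

    grad_f returns a (sub)gradient of a convex objective at x; the
    diminishing step sizes 1/sqrt(k) are a standard textbook choice.
    """
    pinvA = np.linalg.pinv(A)
    P = np.eye(A.shape[1]) - pinvA @ A        # orthogonal projector onto null(A)
    x = pinvA @ b + P @ (x0 - pinvA @ b)      # project the start point to feasibility
    for k in range(1, steps + 1):
        g = grad_f(x)                          # (sub)gradient of the objective
        x = x - (1.0 / np.sqrt(k)) * (P @ g)   # projected step stays feasible
    return x

# Toy instance: minimize ||x||^2 over x1 + x2 = 1; the optimum is (0.5, 0.5).
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x = projected_subgradient(lambda x: 2 * x, A, b, np.array([1.0, 0.0]))
```

Because the constraint set is an affine subspace, the projection is a fixed linear map computed once up front, which is what makes this problem class convenient for subgradient and first-order methods.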